WSNet: Compact and Efficient Networks with Weight Sampling
Authors
Abstract
We present a new approach and a novel architecture, termed WSNet, for learning compact and efficient deep neural networks. Existing approaches conventionally learn full model parameters independently at first and then compress them via ad hoc processing such as model pruning or filter factorization. In contrast, WSNet learns model parameters by sampling from a compact set of learnable parameters, which naturally enforces parameter sharing throughout the learning process. We show that this weight-sampling approach (and the induced WSNet) favorably promotes both weight and computation sharing: it learns much smaller networks with competitive performance, compared to baseline networks with an equal number of convolution filters. Specifically, we consider learning compact and efficient 1D convolutional neural networks for audio classification. Extensive experiments on multiple audio classification datasets verify the effectiveness of WSNet. Combined with weight quantization, the resulting models are up to 180× smaller and theoretically up to 16× faster than well-established baselines, without noticeable performance drop.
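The core idea of sampling filters from a compact shared parameter set can be illustrated with a minimal numpy sketch. Here each 1D filter is taken as an overlapping window (with a fixed stride) over a single compact parameter vector, so adjacent filters share weights by construction; the function name, stride scheme, and sizes are illustrative assumptions, not the exact sampling rule used in WSNet.

```python
import numpy as np

def sample_filters(compact_params, n_filters, k, stride):
    """Sketch of weight sampling: extract n_filters overlapping windows
    of length k (offset by `stride`) from one shared parameter vector,
    and use them as 1D convolution filters."""
    return np.stack([compact_params[i * stride : i * stride + k]
                     for i in range(n_filters)])

# 10 shared parameters yield 4 filters of length 4 (16 weights "used"),
# because consecutive filters overlap and thus share parameters.
compact = np.arange(10.0)
filters = sample_filters(compact, n_filters=4, k=4, stride=2)
print(filters.shape)  # (4, 4)
```

Because every sampled filter is a view into the same compact vector, gradient updates to one filter's weights automatically affect the overlapping filters, which is what enforces sharing during training rather than after it.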
Similar resources
Web Services-based network management: approaches and the WSNET system
While the Simple Network Management Protocol (SNMP) is still the dominant protocol for managing network elements in IP-based networks and the Internet, network managers are acknowledging its limitations with respect to configuration management, application development and decentralization of management tasks. Web Services (WS) have been recently proposed to alleviate these limitations, given th...
Adding Network Coding Capabilities to the WSNet Simulator
This technical report presents the implementation of a Network Coding module in WSNet, a Wireless Sensor Network simulator. This implementation provides a generic programming interface to allow easy specialization of different coding strategies: random, source/destination-oriented, intra/inter-flow, etc. Key-words: Network Coding, Wireless Sensor Network (WSN), Simulation, WSNet ∗ University ...
On the Compactness, Efficiency, and Representation of 3D Convolutional Networks: Brain Parcellation as a Pretext Task
Deep convolutional neural networks are powerful tools for learning visual representations from images. However, designing efficient deep architectures to analyse volumetric medical images remains challenging. This work investigates efficient and flexible elements of modern convolutional networks, such as dilated convolution and residual connection. With these essential building blocks, we propos...
Perfect sampling for closed queuing networks
In this paper we investigate coupling from the past (CFTP) algorithms for closed queuing networks. The stationary distribution has a product form only in a very limited number of particular cases when queue capacity is finite, and numerical algorithms are intractable due to the cardinality of the state space. Moreover, closed networks do not exhibit any monotonic property enabling efficient CFT...
Efficient Neural Audio Synthesis
Sequential models achieve state-of-the-art results in the audio, visual and textual domains with respect to both estimating the data distribution and generating high-quality samples. Efficient sampling for this class of models has, however, remained an elusive problem. With a focus on text-to-speech synthesis, we describe a set of general techniques for reducing sampling time while maintaining high o...
Journal: CoRR
Volume: abs/1711.10067
Pages: -
Publication date: 2017